
    Maximal theorems and square functions for analytic operators on Lp-spaces

    Let T : Lp --> Lp be a contraction, with p strictly between 1 and infinity, and assume that T is analytic, that is, there exists a constant K such that n\norm{T^n-T^{n-1}} < K for any positive integer n. Under the assumption that T is positive (or contractively regular), we establish the boundedness of various Littlewood-Paley square functions associated with T. As a consequence we show maximal inequalities of the form \norm{\sup_{n\geq 0}\, (n+1)^m \bigl|T^n(T-I)^m(x)\bigr|}_p \lesssim \norm{x}_p, for any nonnegative integer m. We prove similar results in the context of noncommutative Lp-spaces. We also give analogs of these maximal inequalities for bounded analytic semigroups, as well as applications to R-boundedness properties.
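    The analyticity condition above can be illustrated numerically. A hypothetical (not from the paper) finite-dimensional example is a "lazy" Markov operator T = (I + P)/2 for a dense row-stochastic matrix P, viewed as a positive contraction in the sup norm; one can check that n * ||T^n - T^(n-1)|| stays bounded in n:

    ```python
    import numpy as np

    # Illustrative example only: a lazy Markov operator T = (I + P)/2,
    # a positive contraction for the sup norm (maximum absolute row sum),
    # for which n * ||T^n - T^(n-1)|| is bounded uniformly in n.
    rng = np.random.default_rng(0)
    P = rng.random((6, 6))
    P /= P.sum(axis=1, keepdims=True)     # row-stochastic matrix
    T = (np.eye(6) + P) / 2               # "lazy" version: spectrum in (0, 1]

    def sup_norm(M):
        """Operator norm induced by l^infinity: maximum absolute row sum."""
        return np.abs(M).sum(axis=1).max()

    Tprev, Tn = np.eye(6), T
    ratios = []
    for n in range(1, 201):
        ratios.append(n * sup_norm(Tn - Tprev))   # the quantity n ||T^n - T^(n-1)||
        Tprev, Tn = Tn, Tn @ T

    print(max(ratios))   # remains bounded -> T is analytic in the sense above
    ```

    For this dense chain the differences T^n - T^(n-1) in fact decay geometrically, so the ratios tend to zero; boundedness is the weaker property the abstract's definition requires.
    
    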

    Strong q-variation inequalities for analytic semigroups

    Let T : Lp --> Lp be a positive contraction, with p strictly between 1 and infinity. Assume that T is analytic, that is, there exists a constant K such that \norm{T^n-T^{n-1}} < K/n for any positive integer n. Let q be strictly between 2 and infinity and let v^q be the space of all complex sequences with a finite strong q-variation. We show that for any x in Lp, the sequence ([T^n(x)](\lambda))_{n\geq 0} belongs to v^q for almost every \lambda, with an estimate \norm{(T^n(x))_{n\geq 0}}_{Lp(v^q)} \leq C\norm{x}_p. If we remove the analyticity assumption, we obtain a similar estimate for the ergodic averages of T instead of the powers of T. We also obtain similar results for strongly continuous semigroups of positive contractions on Lp-spaces.
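    For a finite sequence, the strong q-variation seminorm used above can be computed by brute force; under one common convention it is the supremum over all increasing index subsequences of the q-sum of increments. A small illustrative sketch (exponential cost, for short sequences only):

    ```python
    from itertools import combinations

    def strong_q_variation(a, q):
        """Brute-force strong q-variation seminorm of a finite sequence:
        sup over increasing subsequences n_0 < n_1 < ... of
        (sum_k |a[n_k] - a[n_{k-1}]|**q) ** (1/q).
        Exponential in len(a); for illustration only."""
        n = len(a)
        best = 0.0
        for size in range(2, n + 1):
            for idx in combinations(range(n), size):
                s = sum(abs(a[idx[k]] - a[idx[k - 1]]) ** q
                        for k in range(1, size))
                best = max(best, s ** (1 / q))
        return best
    ```

    For q > 1 and a monotone sequence, the supremum is attained by the endpoints alone, so the q-variation equals the total increment; an oscillating sequence, by contrast, accumulates every swing.
    
    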

    NPRF: A Neural Pseudo Relevance Feedback Framework for Ad-hoc Information Retrieval

    Pseudo-relevance feedback (PRF) is commonly used to boost the performance of traditional information retrieval (IR) models by using top-ranked documents to identify and weight new query terms, thereby reducing the effect of query-document vocabulary mismatches. While neural retrieval models have recently demonstrated strong results for ad-hoc retrieval, combining them with PRF is not straightforward due to incompatibilities between existing PRF approaches and neural architectures. To bridge this gap, we propose an end-to-end neural PRF framework that can be used with existing neural IR models by embedding different neural models as building blocks. Extensive experiments on two standard test collections confirm the effectiveness of the proposed NPRF framework in improving the performance of two state-of-the-art neural IR models. Comment: Full paper in EMNLP 201
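    The general PRF scoring idea can be sketched as follows. This is a hypothetical illustration, not the paper's architecture: the function names, the linear interpolation, and the weighting scheme are all assumptions. A target document is scored directly against the query and also against each top-ranked feedback document, with each feedback document's contribution weighted by its own retrieval score:

    ```python
    # Hypothetical PRF scoring sketch (names and combination rule are
    # illustrative assumptions, not the NPRF paper's exact model).
    def nprf_score(query, doc, feedback_docs, rel_qd, rel_dd, alpha=0.5):
        """rel_qd(query, doc): query-document relevance model
           (in NPRF this slot is filled by a neural IR model).
        rel_dd(fb_doc, doc): document-document relevance model.
        feedback_docs: [(fb_doc, retrieval_score), ...] top-ranked documents."""
        if feedback_docs:
            total = sum(w for _, w in feedback_docs)
            # Feedback evidence: relevance to each feedback document,
            # weighted by that document's own (normalized) retrieval score.
            prf = sum(w / total * rel_dd(fb, doc) for fb, w in feedback_docs)
        else:
            prf = 0.0
        # Interpolate direct query evidence with feedback evidence.
        return alpha * rel_qd(query, doc) + (1 - alpha) * prf
    ```

    In an end-to-end neural framework both relevance functions are differentiable models trained jointly, which is what makes existing hand-crafted PRF term-weighting schemes hard to reuse directly.
    
    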

    Document Clustering Based On Max-Correntropy Non-Negative Matrix Factorization

    Nonnegative matrix factorization (NMF) has been successfully applied to many areas for classification and clustering. Commonly-used NMF algorithms mainly aim at minimizing the l_2 distance or Kullback-Leibler (KL) divergence, which may not be suitable for the nonlinear case. In this paper, we propose a new decomposition method for document clustering that maximizes the correntropy between the original matrix and the product of two low-rank matrices. This method also allows us to learn the new basis vectors of the semantic feature space from the data. To our knowledge, no prior work has maximized correntropy in NMF to cluster high-dimensional document data. Our experimental results show the superiority of the proposed method over other NMF variants on the Reuters21578 and TDT2 datasets. Comment: International Conference on Machine Learning and Cybernetics (ICMLC) 201
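    A common way to optimize a correntropy objective is a half-quadratic reformulation, which turns each step into a weighted least-squares NMF. The sketch below follows that generic recipe under stated assumptions (Gaussian kernel width sigma, standard multiplicative updates on the weighted objective); it is an illustration of the technique, not the paper's exact algorithm:

    ```python
    import numpy as np

    def correntropy_nmf(X, rank, sigma=1.0, iters=200, seed=0, eps=1e-9):
        """Sketch: maximize sum_ij exp(-(X - WH)_ij^2 / (2 sigma^2)) via a
        half-quadratic surrogate. Each iteration fixes weights
        A_ij = exp(-residual_ij^2 / (2 sigma^2)) (well-fit entries count
        more, outliers are downweighted) and applies multiplicative
        updates for the weighted Frobenius objective."""
        rng = np.random.default_rng(seed)
        m, n = X.shape
        W = rng.random((m, rank))
        H = rng.random((rank, n))
        for _ in range(iters):
            R = X - W @ H
            A = np.exp(-(R ** 2) / (2 * sigma ** 2))   # half-quadratic weights
            # Weighted multiplicative updates; positivity is preserved.
            H *= (W.T @ (A * X)) / (W.T @ (A * (W @ H)) + eps)
            W *= ((A * X) @ H.T) / ((A * (W @ H)) @ H.T + eps)
        return W, H
    ```

    The adaptive weights A are what distinguish this from plain l_2 NMF: entries with large residuals are smoothly suppressed rather than dominating the fit, which is the robustness correntropy buys in the nonlinear case.
    
    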